Near-Optimal Conversion of Hardness into Pseudo-Randomness
Authors
Abstract
Various efforts ([?, ?, ?]) have been made in recent years to derandomize probabilistic algorithms using the complexity-theoretic assumption that there exists a problem in E = dtime(2^O(n)) that requires circuits of size s(n) (for some function s). These results are based on the NW-generator [?]. For the strong lower bound s(n) = 2^Ω(n), [?], and later [?], obtain the optimal derandomization, P = BPP. However, for weaker lower bound functions s(n), these constructions fall far short of the natural conjecture for optimal derandomization, namely that bptime(t) ⊆ dtime(2^O(s^(-1)(t))). The gap in these constructions is due to an inherent limitation on efficiency in NW-style pseudo-random generators. In this paper we achieve derandomization in almost optimal time using any lower bound s(n). We do this by using the NW-generator in a new, more sophisticated way. We view any failure of the generator as a reduction from the given "hard" function to its restrictions on smaller input sizes. Thus, either the original construction works (almost) optimally, or one of the restricted functions is (almost) as hard as the original. Any such restriction can then be plugged into the NW-generator recursively. This process generates many "candidate" generators, all of which are (almost) optimal, and at least one of which is guaranteed to be "good". Then, to approximate the acceptance probability of the given circuit (which is the key to derandomization), we use ideas from [?]: we run a tournament between the "candidate" generators, which yields an accurate estimate. Following Trevisan, we explore information-theoretic analogs of our new construction. Trevisan [?] (and then [?]) used the NW-generator to construct efficient extractors. However, the inherent limitation of the NW-generator mentioned above makes the extra randomness required by that extractor suboptimal (for certain parameters).
Applying our construction, we show how to use a weak random source with an optimal amount of extra randomness for the (simpler than extraction) task of estimating the probability of any event (which is given by an oracle).
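To make the objects in the abstract concrete, here is a minimal toy sketch of an NW-style generator and of the exhaustive-seed estimate of a circuit's acceptance probability. This is not the paper's construction: all sizes are tiny illustrative choices, the hand-picked subset family stands in for a real combinatorial design, and parity stands in for a genuinely hard function (parity is easy for small circuits; it only serves to show the shape of the computation).

```python
from itertools import product

# Toy sizes -- purely illustrative, far from the asymptotic regime of the paper.
SEED_LEN = 8    # d: seed length of the generator
INPUT_LEN = 4   # n: input length of the "hard" function f

# A hand-picked family of subsets of seed positions, each of size n, with
# pairwise intersections of size at most 2 -- a stand-in for the
# combinatorial designs used by the NW generator.
DESIGN = [
    (0, 1, 2, 3),
    (0, 1, 4, 5),
    (2, 3, 4, 5),
    (0, 2, 4, 6),
    (1, 3, 5, 7),
    (0, 3, 5, 6),
]

def f(bits):
    # Stand-in "hard" function (parity; a real instantiation would use a
    # function that small circuits cannot compute).
    return sum(bits) % 2

def nw_generator(seed):
    # One output bit per design set: f evaluated on the seed restricted to
    # that set. The small pairwise overlaps are what drive the analysis.
    return tuple(f([seed[i] for i in s]) for s in DESIGN)

def estimate_acceptance(circuit):
    # Derandomized estimate of Pr[circuit accepts]: average the circuit
    # over the generator's outputs on all 2^d seeds.
    seeds = list(product([0, 1], repeat=SEED_LEN))
    return sum(circuit(nw_generator(s)) for s in seeds) / len(seeds)
```

For example, `estimate_acceptance(lambda y: y[0])` averages the first output bit over all 256 seeds; since that bit is the parity of four independent seed bits, the estimate is exactly 0.5. The paper's contribution concerns what to do when a generator built this way fails: the failure is reinterpreted as a reduction to a restriction of f, yielding the recursion and the tournament described above.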
Similar Papers
Understanding the Impact of Pseudo Randomness on Internet Worm Propagation
Recent research has shown that worms continue to remain among the significant infectors of the Internet [1]. Our understanding of worms' propagation and malicious potential must keep pace with their rapid evolution. In this paper, we point out a fundamental flaw in past worm analysis: ignoring the importance of the pseudo randomness employed in the worm code. The impact of pseudo randomness...
Reducing The Seed Length In The Nisan-Wigderson Generator
The Nisan-Wigderson pseudo-random generator [NW94] was constructed to derandomize probabilistic algorithms under the assumption that there exist explicit functions which are hard for small circuits. We give the first explicit construction of a pseudo-random generator with asymptotically optimal seed length even when given a function which is hard for relatively small circuits. Generators with o...
Soybean Oil Transesterification Reactions in the Presence of Mussel Shell: Pseudo-First Order Kinetics
Calcium oxide is one of the appropriate catalysts for biodiesel production. In this study, a cheap and environmentally compatible catalyst has been used. Mussel shell from the Persian Gulf coast is a source of calcium carbonate, which is converted to calcium oxide at a calcination temperature of up to 950°C. The transesterification reaction was carried out at the optimum conditions of our previous study...
Pseudo-randomness and partial information in symbolic security analysis
We prove computational soundness results for cryptographic expressions with pseudo-random keys, as used, for example, in the design and analysis of secure multicast key distribution protocols. In particular, we establish a symbolic notion of independence (for pseudo-random keys) that exactly matches the standard computational security definition (namely, indistinguishability from the uniform di...
Do Probabilistic Algorithms Outperform Deterministic Ones?
The introduction of randomization into efficient computation has been one of the most fertile and useful ideas in computer science. In cryptography and asynchronous computing, randomization makes possible tasks that are impossible to perform deterministically. For function computation, many examples are known in which randomization allows considerable savings in resources like spac...